Mixtures of truncated basis functions
Authors
Abstract
In this paper we propose a framework, called mixtures of truncated basis functions (MoTBFs), for representing general hybrid Bayesian networks. The proposed framework generalizes both the mixture of truncated exponentials (MTEs) framework and the mixture of polynomials (MoPs) framework. Similar to MTEs and MoPs, MoTBFs are defined so that the potentials are closed under combination and marginalization, which ensures that inference in MoTBF networks can be performed efficiently using the Shafer-Shenoy architecture. Based on a generalized Fourier series approximation, we devise a method for efficiently approximating an arbitrary density function using the MoTBF framework. The translation method is more flexible than existing MTE- or MoP-based methods, and it supports an online/anytime tradeoff between the accuracy and the complexity of the approximation. Experimental results show that the approximations obtained are either comparable to or significantly better than the approximations obtained using existing methods.
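As a brief illustration of the representation described in the abstract, a univariate MoTBF potential can be written as a finite linear combination of fixed basis functions. The symbols a_i, \psi_i, k, and \Omega_X below are generic notation chosen for this note rather than taken from the page:

    f(x) = \sum_{i=0}^{k} a_i \, \psi_i(x), \qquad x \in \Omega_X,

where the \psi_i are real-valued basis functions and the a_i are real coefficients. Taking the monomial basis \psi_i(x) = x^i gives a mixture of polynomials (MoP), while an exponential basis of the form \psi_i(x) = \exp(\lambda_i x) gives a mixture of truncated exponentials (MTE); this is the sense in which MoTBFs generalize both frameworks.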
Similar articles
Learning Conditional Distributions Using Mixtures of Truncated Basis Functions
Mixtures of Truncated Basis Functions (MoTBFs) have recently been proposed for modelling univariate and joint distributions in hybrid Bayesian networks. In this paper we analyse the problem of learning conditional MoTBF distributions from data. Our approach utilizes a new technique for learning joint MoTBF densities, and then proposes a method for using these to generate the conditional distribution...
MODELING OF FLOW NUMBER OF ASPHALT MIXTURES USING A MULTI–KERNEL BASED SUPPORT VECTOR MACHINE APPROACH
The flow number of asphalt–aggregate mixtures has been proposed as an explanatory factor for assessing the rutting potential of asphalt mixtures. This study proposes a multiple–kernel based support vector machine (MK–SVM) approach for modeling the flow number of asphalt mixtures. The MK–SVM approach consists of a weighted least squares–support vector machine (WLS–SVM) integrating two kernel funct...
Learning Mixtures of Truncated Basis Functions from Data
In this paper we describe a new method for learning hybrid Bayesian network models from data. The method utilizes a kernel density estimator, which is in turn “translated” into a mixture-of-truncated-basis-functions representation using a convex optimization technique. We argue that these estimators approximate the maximum likelihood estimators, and compare our approach to previous attempts at ... (a simplified, illustrative sketch of such a translation step is given after this list of similar articles)
Recurrence Relations for Moment Generating Functions of Generalized Order Statistics Based on Doubly Truncated Class of Distributions
In this paper, we derived recurrence relations for joint moment generating functions of nonadjacent generalized order statistics (GOS) of random samples drawn from doubly truncated class of continuous distributions. Recurrence relations for joint moments of nonadjacent GOS (ordinary order statistics (OOS) and k-upper records (k-RVs) as special cases) are obtained. Single and product moment gene...
Learning mixtures of polynomials from data using B-spline interpolation
Hybrid Bayesian networks efficiently encode a joint probability distribution over a set of continuous and discrete variables. Several approaches have been recently proposed for working with hybrid Bayesian networks, e.g., mixtures of truncated basis functions, mixtures of truncated exponentials or mixtures of polynomials (MoPs). We present a method for learning MoP approximations of probability...
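As a rough illustration of the kind of kernel-density-to-basis translation mentioned in "Learning Mixtures of Truncated Basis Functions from Data" above, the Python sketch below fits a low-order monomial basis to a kernel density estimate by ordinary least squares and then renormalizes the result. The basis order, support, and toy data are assumptions made for this example only; the method in that paper relies on a constrained convex optimization formulation, which is not reproduced here.

import numpy as np
from scipy.stats import gaussian_kde

# Toy data standing in for a real training sample.
rng = np.random.default_rng(0)
sample = rng.normal(0.0, 1.0, size=500)

kde = gaussian_kde(sample)            # target density estimate
lo, hi, order = -3.0, 3.0, 6          # assumed support and basis order
xs = np.linspace(lo, hi, 200)
target = kde(xs)

# Design matrix with monomial basis functions 1, x, ..., x^order (the MoP choice of basis).
design = np.vander(xs, order + 1, increasing=True)
coeffs, *_ = np.linalg.lstsq(design, target, rcond=None)

fit = design @ coeffs
fit = np.clip(fit, 0.0, None)         # crude non-negativity repair
fit /= fit.sum() * (xs[1] - xs[0])    # renormalize so the fit integrates to roughly 1

print("max abs deviation from the KDE:", float(np.max(np.abs(fit - target))))

A least-squares projection onto a truncated basis is used here only because it is simple and self-contained; it illustrates the representational idea, not the learning algorithm of any of the papers above.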
Journal: Int. J. Approx. Reasoning
Volume: 53, Issue: -
Pages: -
Publication date: 2012